Finite Mixture Model of Bounded Semi-naive Bayesian Networks Classifier
Authors
Abstract
The Semi-Naive Bayesian network (SNB) classifier, a probabilistic model that assumes conditional independence among combined attributes, shows good performance in classification tasks. However, traditional SNBs can only combine two attributes into a combined attribute. This inflexibility, together with the strong independence assumption, may generate inaccurate distributions for some datasets and thus may greatly restrict the classification performance of SNBs. In this paper we develop a Bounded Semi-Naive Bayesian network (B-SNB) model based on direct combinatorial optimization. Our model can join any number of attributes within a given bound while maintaining a polynomial time cost. This improvement expands the expressive ability of the SNB and thus provides the potential to increase accuracy in classification tasks. Further, aiming to relax the strong independence assumption of the SNB, we propose an algorithm that extends the B-SNB into a finite mixture structure, named the Mixture of Bounded Semi-Naive Bayesian networks (MBSNB). We give theoretical derivations, an outline of the algorithm, an analysis of the algorithm, and a set of experiments to demonstrate the usefulness of the MBSNB in classification tasks. The novel finite MBSNB network shows better classification performance than the other types of classifiers examined in this paper.
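Read concretely (the grouping notation below is ours, inferred from the abstract, not taken from the paper): if the attributes are partitioned into disjoint groups B_1, ..., B_K, each containing no more attributes than the given bound, the B-SNB factorizes the class-conditional distribution as

P(x_1, \dots, x_n \mid c) = \prod_{k=1}^{K} P(B_k \mid c),

and the MBSNB replaces this with a finite mixture of M such factorizations,

P(x_1, \dots, x_n \mid c) = \sum_{m=1}^{M} \lambda_m \prod_{k=1}^{K_m} P_m(B_k^{(m)} \mid c), \qquad \lambda_m \ge 0, \quad \sum_{m} \lambda_m = 1,

so each component may use a different attribute partition, loosening the single-partition independence assumption.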
Similar Resources
Finite Mixture Model of Bounded Semi-Naive Bayesian Networks for Classification
The Naive Bayesian (NB) network classifier, a probabilistic model with a strong assumption of conditional independence among features, shows surprisingly competitive prediction performance even when compared with some state-of-the-art classifiers. With a looser assumption of conditional independence, the Semi-Naive Bayesian (SNB) network classifier is superior to NB classifiers when features ...
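A minimal sketch, assuming discrete-valued features (our illustration, not code from either paper), makes the NB factorization P(c | x) ∝ P(c) ∏_i P(x_i | c) explicit; an SNB would replace the single-attribute factors P(x_i | c) with factors over small combined-attribute groups:

import math
from collections import Counter

def train_nb(X, y, alpha=1.0):
    """Fit a discrete Naive Bayes model with Laplace smoothing `alpha`,
    returning a function that maps a row to per-class log posteriors."""
    class_counts = Counter(y)
    n = len(X[0])
    domains = [{row[i] for row in X} for i in range(n)]  # feature value sets
    cond = {c: [Counter() for _ in range(n)] for c in class_counts}
    for row, c in zip(X, y):
        for i, v in enumerate(row):
            cond[c][i][v] += 1
    total = sum(class_counts.values())

    def log_posterior(row):
        scores = {}
        for c, nc in class_counts.items():
            s = math.log(nc / total)  # log prior P(c)
            for i, v in enumerate(row):
                # conditional independence: add log P(x_i | c) per feature
                s += math.log((cond[c][i][v] + alpha)
                              / (nc + alpha * len(domains[i])))
            scores[c] = s
        return scores

    return log_posterior

# usage: scores = train_nb(X, y)(row); label = max(scores, key=scores.get)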
Full Text
A Bayesian Network Classifier that Combines a Finite Mixture Model and a Naive Bayes Model
In this paper we present a new Bayesian network model for classification that combines the naive Bayes (NB) classifier and the finite mixture (FM) classifier. The resulting classifier aims at relaxing the strong assumptions on which the two component models are based, in an attempt to improve on their classification performance, both in terms of accuracy and in terms of calibration of the...
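The snippet does not spell out how the two component models are coupled; one simple illustrative combination (our assumption, not necessarily the authors' construction) blends the two class-conditional estimates convexly,

P(x \mid c) = \alpha \, P_{NB}(x \mid c) + (1 - \alpha) \, P_{FM}(x \mid c), \qquad 0 \le \alpha \le 1,

where P_{FM}(x \mid c) = \sum_m \lambda_m P_m(x \mid c) is the finite-mixture part, so that \alpha trades the NB model's strong independence assumption against the FM model's flexibility.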
Full Text
Stability Evaluation of Neural and Statistical Classifiers Based on Modified Semi-bounded Plug-in Algorithm
This paper presents a new criterion for evaluating neural network stability relative to the Bayesian classifier. The stability comparison is performed by estimating the error-rate probability densities with the modified semi-bounded Plug-in algorithm. In this work, we attempt to demonstrate that the Bayesian approach for neural networks improves the performance and stability degree of the...
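The modified semi-bounded Plug-in algorithm itself is not described in this snippet; as a generic stand-in (our illustration only), a kernel density estimate of per-run error rates shows the kind of comparison involved, where a more stable classifier yields a narrower density:

import numpy as np
from scipy.stats import gaussian_kde

def error_rate_density(error_rates, grid=None):
    """Kernel density estimate of a classifier's error-rate distribution
    over repeated train/test runs (generic KDE, not the paper's method)."""
    rates = np.asarray(error_rates, dtype=float)
    kde = gaussian_kde(rates)
    if grid is None:
        grid = np.linspace(0.0, 1.0, 200)  # error rates live in [0, 1]
    return grid, kde(grid)

# usage: grid, dens = error_rate_density([0.12, 0.10, 0.13, 0.11, 0.12])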
Full Text
The Use of the Modified Semi-bounded Plug-in Algorithm to Compare Neural and Bayesian Classifiers Stability
Despite the widespread use of neural networks in industrial applications, their mathematical formulation remains difficult to analyze. This explains the limited amount of work that formally models their classification volatility. Taking the statistical point of view, we attempt in this work to evaluate the stability degree of classical and Bayesian neural networks compared to the st...
Full Text
Flexible and Robust Bayesian Classification by Finite Mixture Models
The regularized Mahalanobis distance is proposed in the framework of finite mixture models to avoid numerical difficulties commonly encountered with EM. Its principle is applied to Gaussian and Student-t mixtures, resulting in reliable density estimates while the model complexity is kept low. Moreover, the regularized models are robust to various noise types. Finally, it is shown that the qu...
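The snippet does not define the regularization; a common form (our assumption, not necessarily the authors' exact formulation) shrinks each component covariance toward the identity before inverting, which keeps EM's M-step numerically stable even for nearly singular covariances:

d_\lambda^2(x, \mu_k) = (x - \mu_k)^\top \big( (1 - \lambda)\,\Sigma_k + \lambda I \big)^{-1} (x - \mu_k), \qquad 0 < \lambda \le 1.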
Full Text